Pruning Adaptive Boosting *** ICML-97 Final Draft ***
Abstract
The boosting algorithm AdaBoost, developed by Freund and Schapire, has exhibited outstanding performance on several benchmark problems when using C4.5 as the weak algorithm to be boosted. Like other ensemble learning approaches, AdaBoost constructs a composite hypothesis by voting many individual hypotheses. In practice, the large amount of memory required to store these hypotheses can make ensemble methods hard to deploy in applications. This paper shows that by selecting a subset of the hypotheses, it is possible to obtain nearly the same levels of performance as the entire set. The results also provide some insight into the behavior of AdaBoost.
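As a concrete illustration of keeping only a subset of the voted hypotheses, the sketch below trains an AdaBoost ensemble with scikit-learn and then greedily selects the hypotheses that maximize accuracy on a held-out pruning set. The greedy reduce-error-style selection rule, the synthetic data, and the subset size of 20 are assumptions chosen for illustration; they are not claimed to be the paper's exact procedure.

```python
# A minimal sketch of pruning a trained AdaBoost ensemble by selecting a
# subset of its hypotheses. The greedy selection on a held-out pruning set
# is an illustration of the general idea, not the paper's exact method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=3000, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_prune, X_test, y_prune, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Boost 100 weak hypotheses (decision stumps by default).
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

def vote(model, X, subset):
    """Weighted vote restricted to a chosen subset of the ensemble's hypotheses."""
    scores = np.zeros(X.shape[0])
    for i in subset:
        # Map binary predictions {0, 1} to {-1, +1} for a signed weighted vote.
        scores += model.estimator_weights_[i] * (2 * model.estimators_[i].predict(X) - 1)
    return (scores > 0).astype(int)

def greedy_prune(model, X, y, k):
    """Greedily pick k hypotheses that maximize accuracy on a pruning set."""
    chosen = []
    remaining = list(range(len(model.estimators_)))
    while len(chosen) < k:
        best = max(remaining, key=lambda i: np.mean(vote(model, X, chosen + [i]) == y))
        chosen.append(best)
        remaining.remove(best)
    return chosen

subset = greedy_prune(ada, X_prune, y_prune, k=20)
print("full ensemble  :", ada.score(X_test, y_test))
print("pruned (20/100):", np.mean(vote(ada, X_test, subset) == y_test))
```

Only the 20 selected stumps (and their voting weights) need to be stored at deployment time, which is the memory saving the abstract refers to.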
Similar Papers
Boosting Lazy Decision Trees
This paper explores the problem of how to construct lazy decision tree ensembles. We present and empirically evaluate a relevance-based boosting-style algorithm that builds a lazy decision tree ensemble customized for each test instance. From the experimental results, we conclude that our boosting-style algorithm significantly improves the performance of the base learner. An empirical comparison...
Quickly Boosting Decision Trees - Pruning Underachieving Features Early
Boosted decision trees are among the most popular learning techniques in use today. While they are fast at test time, their relatively slow training renders them impractical for applications with real-time learning requirements. We propose a principled approach to overcome this drawback. We prove a bound on the error of a decision stump given its preliminary error on a subset of the training...
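The early-pruning idea can be sketched with a simple (and intentionally loose) observation: the weighted error mass a candidate split accumulates on a subset of the training examples can only grow once the remaining examples are added, so any candidate whose preliminary error already exceeds the current best can be skipped without a full evaluation. The bound used below, the stump search, and the synthetic data are illustrative assumptions, not the paper's actual bound or algorithm.

```python
# Rough illustration of pruning underachieving stump candidates early using
# a preliminary weighted error computed on a subset of the training data.
import numpy as np

def stump_error(x, y, w, threshold):
    """Weighted error mass of the stump 'predict +1 when x > threshold'."""
    pred = np.where(x > threshold, 1, -1)
    return np.sum(w * (pred != y))

def best_split_with_early_pruning(X, y, w, thresholds, subset_size):
    """Search all (feature, threshold) stumps, skipping candidates whose
    preliminary error on a heavy-weight subset already rules them out."""
    order = np.argsort(-w)          # examples with the largest weights first
    sub = order[:subset_size]
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in thresholds:
            # Error mass on the subset is a lower bound on the full-set error.
            prelim = stump_error(X[sub, j], y[sub], w[sub], t)
            if prelim >= best_err:
                continue            # pruned without evaluating the full set
            err = stump_error(X[:, j], y, w, t)
            if err < best_err:
                best, best_err = (j, t), err
    return best, best_err

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = np.where(X[:, 3] > 0.2, 1, -1)      # only feature 3 is informative
w = np.full(1000, 1 / 1000)             # uniform boosting weights
print(best_split_with_early_pruning(X, y, w, np.linspace(-1, 1, 9), 200))
```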
An Empirical Comparison of Pruning Methods for Ensemble Classifiers
Many researchers have shown that ensemble methods such as Boosting and Bagging improve the accuracy of classification. Boosting and Bagging perform well with unstable learning algorithms such as neural networks or decision trees. Pruning decision tree classifiers is intended to make trees simpler and more comprehensible and to avoid over-fitting. However, it is known that pruning individual classif...
Adaptive Predictive Controllers Using a Growing and Pruning RBF Neural Network
An adaptive version of a growing and pruning RBF neural network has been used to predict the system output and implement Linear Model-Based Predictive Controller (LMPC) and Non-linear Model-Based Predictive Controller (NMPC) strategies. A radial-basis neural network with growing and pruning capabilities is introduced to carry out on-line model identification. An Unscented Kal...
On the Boosting Pruning Problem (Short Submission)
Boosting is a powerful method for improving the predictive accuracy of classifiers. The AdaBoost algorithm of Freund and Schapire has been successfully applied to many domains [2, 10, 12] and the combination of AdaBoost with the C4.5 decision tree algorithm has been called the best off-the-shelf learning algorithm in practice. Unfortunately, in some applications, the number of decision trees requi...